Bounded perturbation resilience of projected scaled gradient methods
Authors
Abstract
Similar Resources
Bounded perturbation resilience of projected scaled gradient methods
We investigate projected scaled gradient (PSG) methods for convex minimization problems. These methods perform a descent step along a diagonally scaled gradient direction followed by a feasibility regaining step via orthogonal projection onto the constraint set. This constitutes a generalized algorithmic structure that encompasses as special cases the gradient projection method, the projected N...
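A minimal Python sketch of one PSG iteration of the kind described above, x_{k+1} = P_C(x_k - lam * D grad_f(x_k) + e_k), with a summable (hence bounded) perturbation sequence e_k. The diagonal scaling D, step size, projection, and test problem are illustrative assumptions, not the paper's specific rules.

import numpy as np

def psg_step(x, grad_f, D, lam, project_C, e=None):
    # One PSG iteration: descend along the diagonally scaled gradient,
    # optionally add a bounded perturbation e, then project back onto C.
    d = D @ grad_f(x)
    if e is None:
        e = np.zeros_like(x)
    return project_C(x - lam * d + e)

# Illustration: minimize ||A x - b||^2 over the nonnegative orthant.
A = np.array([[2.0, 0.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
grad = lambda x: 2.0 * A.T @ (A @ x - b)
D = np.diag([0.1, 0.05])                 # hypothetical diagonal scaling
proj = lambda x: np.maximum(x, 0.0)      # projection onto {x : x >= 0}

x = np.zeros(2)
for k in range(200):
    e_k = (0.5 ** k) * np.ones(2)        # summable, hence bounded, perturbations
    x = psg_step(x, grad, D, lam=1.0, project_C=proj, e=e_k)
print(x)                                 # approx. [0.5, 0.5], the constrained minimizer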
Bounded perturbation resilience of extragradient-type methods and their applications
In this paper we study the bounded perturbation resilience of the extragradient and the subgradient extragradient methods for solving a variational inequality (VI) problem in real Hilbert spaces. This is an important property of algorithms which guarantees the convergence of the scheme under summable errors, meaning that an inexact version of the methods can also be considered. Moreover, once a...
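For context, a minimal Python sketch of the classical extragradient iteration for VI(F, C): a predictor projection followed by a corrector projection that re-evaluates F at the predicted point. The operator, step size tau, and box constraint below are illustrative assumptions; the subgradient extragradient variant and the perturbation analysis from the paper are not reproduced here.

import numpy as np

def extragradient_step(x, F, tau, project_C):
    # Predictor: y = P_C(x - tau F(x)); corrector re-evaluates F at y.
    y = project_C(x - tau * F(x))
    return project_C(x - tau * F(y))

# Illustration: VI(F, C) with a monotone affine operator and a box constraint.
M = np.array([[0.0, 1.0], [-1.0, 0.0]])  # skew-symmetric, hence monotone
q = np.array([-0.5, 0.5])
F = lambda x: M @ x + q
proj = lambda x: np.clip(x, -1.0, 1.0)   # projection onto the unit box

x = np.array([1.0, -1.0])
for _ in range(500):
    x = extragradient_step(x, F, tau=0.3, project_C=proj)
print(x)                                 # approx. [0.5, 0.5], where F vanishes inside C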
Spectral Projected Gradient Methods
The poor practical behavior of (1)-(2) has been known for many years. If the level sets of f resemble long valleys, the sequence {xk} displays a typical zig-zagging trajectory and the speed of convergence is very slow. In the simplest case, in which f is a strictly convex quadratic, the method converges to the solution with a Q-linear rate of convergence whose factor tends to 1 when the conditi...
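A small illustration of that slow Q-linear behavior (a standard textbook example, not taken from this abstract's equations (1)-(2)): exact-line-search steepest descent on f(x) = 0.5 x^T H x with condition number kappa shrinks f by exactly ((kappa-1)/(kappa+1))^2 per step from the worst-case starting point, a factor that tends to 1 as kappa grows.

import numpy as np

# Steepest descent with exact line search on the classic "long valley".
kappa = 100.0
H = np.diag([1.0, kappa])
f = lambda x: 0.5 * x @ (H @ x)
x = np.array([kappa, 1.0])           # worst-case starting point
for k in range(5):
    g = H @ x                        # gradient of the quadratic
    alpha = (g @ g) / (g @ (H @ g))  # exact minimizer along -g
    x_new = x - alpha * g
    print(k, f(x_new) / f(x))        # ratio = ((kappa-1)/(kappa+1))**2 = 0.96
    x = x_new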
Projected Gradient Methods for Nonnegative Matrix Factorization
Nonnegative matrix factorization (NMF) can be formulated as a minimization problem with bound constraints. Although bound-constrained optimization has been studied extensively in both theory and practice, so far no study has formally applied its techniques to NMF. In this letter, we propose two projected gradient methods for NMF, both of which exhibit strong optimization properties. We discuss ...
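As a rough point of comparison, here is a minimal fixed-step alternating projected-gradient sketch for the bound-constrained NMF formulation in Python. The letter's two methods themselves are not reproduced; the rank, step size, and iteration count below are arbitrary illustrative choices.

import numpy as np

def nmf_pg(V, r, steps=1000, lr=0.05, seed=0):
    # Alternating projected gradient on 0.5 * ||V - W H||_F^2 with W, H >= 0;
    # the projection onto the nonnegative orthant is element-wise max with 0.
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, r))
    H = rng.random((r, n))
    for _ in range(steps):
        W = np.maximum(W - lr * ((W @ H - V) @ H.T), 0.0)
        H = np.maximum(H - lr * (W.T @ (W @ H - V)), 0.0)
    return W, H

V = np.random.default_rng(1).random((6, 5))   # nonnegative test matrix
W, H = nmf_pg(V, r=2)
print(np.linalg.norm(V - W @ H))              # residual of the rank-2 fit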
Spectral Projected Gradient Methods: Review and Perspectives
Over the last two decades, it has been observed that using the gradient vector as a search direction in large-scale optimization may lead to efficient algorithms. The effectiveness relies on choosing the step lengths according to novel ideas that are related to the spectrum of the underlying local Hessian rather than related to the standard decrease in the objective function. A review of these ...
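The spectral step length these methods use is the Barzilai-Borwein choice, which makes 1/alpha a scalar stand-in for the local Hessian along the most recent step. A minimal sketch, assuming the usual safeguard interval (bounds illustrative):

import numpy as np

def bb_step(x, x_prev, g, g_prev, alpha_min=1e-10, alpha_max=1e10):
    # Spectral (Barzilai-Borwein) step length: alpha = (s's) / (s'y) with
    # s = x - x_prev and y = g - g_prev, so 1/alpha is a Rayleigh-quotient
    # approximation of the local Hessian curvature along the last step.
    s = x - x_prev
    y = g - g_prev
    sy = s @ y
    if sy <= 0.0:
        return alpha_max                  # safeguard for non-positive curvature
    return float(np.clip((s @ s) / sy, alpha_min, alpha_max))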
Journal
Journal title: Computational Optimization and Applications
Year: 2015
ISSN: 0926-6003, 1573-2894
DOI: 10.1007/s10589-015-9777-x